
    From homogeneous to fractal normal and tumorous microvascular networks in the brain

    We studied normal and tumorous three-dimensional (3D) microvascular networks in primate and rat brain. Tissues were prepared following a new preparation technique intended for high-resolution synchrotron tomography of microvascular networks. The resulting 3D images, with a spatial resolution finer than the minimum capillary diameter, permit a complete description of the entire vascular network for volumes as large as tens of cubic millimeters. The structural properties of the vascular networks were investigated by several multiscale methods such as fractal and power-spectrum analysis. These investigations give a new, coherent picture of normal and pathological complex vascular structures. They show that normal cortical vascular networks have scale-invariant fractal properties on small scales, from 1.4 ”m up to 40 to 65 ”m; above this threshold, vascular networks can be considered homogeneous. Tumor vascular networks show similar characteristics, but the validity range of the fractal regime extends to much larger spatial dimensions. These 3D results shed new light on previous two-dimensional analyses, giving for the first time a direct measurement of the vascular modules associated with vessel-tissue surface exchange.
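    The fractal analysis mentioned above is commonly performed by box counting. As an illustration only (not the authors' pipeline), a minimal estimator of the box-counting dimension of a binary vessel mask could look like this; the function name and box sizes are assumptions:

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal (box-counting) dimension of a binary mask.

    Counts occupied boxes N(s) at each box size s and fits
    log N(s) ~ -D log s; the magnitude of the slope is the estimate.
    Assumes a square mask whose side is divisible by every box size.
    """
    counts = []
    n = mask.shape[0]
    for s in box_sizes:
        # Tile the mask with s x s boxes and count boxes containing vessels.
        boxes = mask[: n - n % s, : n - n % s].reshape(n // s, s, n // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope
```

    A fully occupied region gives a dimension near 2, a thin line near 1; scale-invariant vascular structures fall in between over the range where the log-log fit is linear.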

    The Concept of the Machining Surface in 5-Axis Milling of Free-form Surfaces

    The concept of the machining surface (MS) is a new approach to the design and manufacture of free-form surfaces. The machining surface is the surface representation of the tool path, integrating functional design specifications and machining constraints. By definition, the machining surface contains all the information necessary to drive the tool, so that the envelope surface of the tool movement sweeping the MS yields the expected free form. In this paper, we study the construction of the MS for 5-axis end milling with the usual cutting tools: ball-end, flat-end, and filleted end mills. We ensure that the design and manufacturing constraints taken into account by the machining surface remain completely uncoupled within the MS, so that these two activities stay independent.
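    For a ball-end tool, a classical machining surface driving the tool centre is the offset of the designed surface at the tool radius, which keeps the tool sphere tangent to the expected form. A minimal sketch of such an offset on a sampled parametric surface (illustrative only; the function name and finite-difference normals are assumptions, not the paper's construction):

```python
import numpy as np

def offset_surface(P, r):
    """Offset a sampled parametric surface P[u, v] = (x, y, z) by r
    along its unit normal.  For a ball-end tool of radius r, driving
    the tool centre on this offset keeps the sphere tangent to the
    designed free-form surface.
    """
    # Partial derivatives via finite differences along the two parameters.
    Pu = np.gradient(P, axis=0)
    Pv = np.gradient(P, axis=1)
    N = np.cross(Pu, Pv)
    N /= np.linalg.norm(N, axis=-1, keepdims=True)
    return P + r * N
```

    Flat and filleted end mills need different offsets (the envelope condition involves the tool axis as well), which is where the 5-axis case becomes non-trivial.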

    Bringing Theory Closer to Practice in Post-quantum and Leakage-resilient Cryptography

    Modern cryptography has pushed forward the need for provable security. Whereas ancient cryptography relied only on heuristic assumptions and the secrecy of its designs, researchers nowadays try to base the security of schemes on mathematical problems that are believed hard to solve. In these proofs, the capabilities of potential adversaries are modeled formally. For instance, the black-box model assumes that an adversary does not learn anything from the inner state of a construction. While this assumption makes sense in some practical scenarios, it was shown that one can sometimes learn information by other means, e.g., by timing how long the computation takes. In this thesis, we focus on two different areas of cryptography. In both parts, we first take a theoretical point of view to obtain a result, and then adapt our results so that they are easily usable by implementers and by researchers working in practical cryptography. In the first part of this thesis, we look at post-quantum cryptography, i.e., at cryptographic primitives that are believed secure even if (reasonably big) quantum computers are built. We introduce HELEN, a new public-key cryptosystem based on the hardness of the learning from parity with noise (LPN) problem. To make our results more concrete, we suggest practical instances which make the system easily implementable. As stated above, the design of cryptographic primitives usually relies on well-studied hard problems. However, to suggest concrete parameters for these primitives, one needs to know the precise complexity of the algorithms solving the underlying hard problem. We focus on two recent hard problems that became very popular in post-quantum cryptography: learning with errors (LWE) and learning with rounding (LWR).
    We introduce a new algorithm that solves both problems and provide a careful complexity analysis, so that these problems can be used to construct practical cryptographic primitives. In the second part, we look at leakage-resilient cryptography, which studies adversaries able to obtain side-channel information from a cryptographic primitive. In the past, two main disjoint models were considered. The first, the threshold probing model, assumes that the adversary can place a limited number of probes in a circuit and then learns all the values going through these probes. This model was used mostly by theoreticians, as it allows very elegant and convenient proofs. The second, the noisy-leakage model, assumes that every component of the circuit leaks but that the observed signal is noisy; typically, some Gaussian noise is added to it. According to experiments, this model closely depicts the real behaviour of circuits, so it is favoured by the practical cryptographic community. In this thesis, we show that a proof in the first model implies a proof in the second model, which unifies the two models and reconciles both communities. We then look at this result from a more practical point of view and show how it can help in evaluating the security of a chip based solely on the standard mutual information metric.
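    The mutual information metric can be illustrated on a toy noisy-leakage experiment: a secret bit is observed through additive Gaussian noise, and a histogram plug-in estimate of I(secret; leakage) quantifies how much the noise hides. This sketch is illustrative only, not the thesis's evaluation procedure; all names are assumptions:

```python
import numpy as np

def mutual_information_bits(secret, leakage, bins=32):
    """Histogram plug-in estimate of I(secret; leakage) in bits.

    `secret` is a 0/1 array; `leakage` is real-valued, e.g.
    secret + Gaussian noise as in the noisy-leakage model.
    """
    edges = np.histogram_bin_edges(leakage, bins=bins)
    p_l = np.histogram(leakage, bins=edges)[0] / leakage.size
    mi = 0.0
    for b in (0, 1):
        p_b = np.mean(secret == b)
        hist = np.histogram(leakage[secret == b], bins=edges)[0]
        p_l_given_b = hist / hist.sum()
        ok = (p_l_given_b > 0) & (p_l > 0)
        # I = sum_b p(b) * KL( p(l|b) || p(l) )
        mi += p_b * np.sum(p_l_given_b[ok] * np.log2(p_l_given_b[ok] / p_l[ok]))
    return mi
```

    With little noise the estimate approaches the full 1 bit of the secret; with heavy noise it drops toward 0, which is exactly what a leakage evaluation wants to measure.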

    HELEN: a Public-key Cryptosystem Based on the LPN Problem (Extended Abstract)

    We propose HELEN, a new code-based public-key cryptosystem whose security is based on the hardness of the Learning from Parity with Noise (LPN) problem and the decisional minimum distance problem. We show that the resulting cryptosystem achieves indistinguishability under chosen-plaintext attacks (IND-CPA security). Using the Fujisaki-Okamoto generic construction, HELEN achieves IND-CCA security in the random oracle model. We further propose concrete parameters.
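    The LPN problem underlying HELEN's security asks an adversary to recover a secret s from noisy parity samples b = A·s ⊕ e, where each error bit is set with probability τ. A toy sample generator (illustrative; the function name and parameters are assumptions, not HELEN itself):

```python
import numpy as np

def lpn_samples(secret, m, tau, rng):
    """Generate m LPN samples (A, b) with b = A.s XOR e (mod 2),
    where each error bit e_i equals 1 with probability tau.

    Recovering `secret` from (A, b) is the Learning from Parity
    with Noise problem; without the noise, Gaussian elimination
    would solve it immediately.
    """
    n = secret.size
    A = rng.integers(0, 2, (m, n))          # uniform binary matrix
    e = (rng.random(m) < tau).astype(int)   # Bernoulli(tau) noise
    b = (A @ secret + e) % 2
    return A, b
```

    The noise rate τ governs the hardness/efficiency trade-off, which is why concrete parameter proposals matter for making such a scheme implementable.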

    Transformers for 1D Signals in Parkinson's Disease Detection from Gait

    This paper focuses on the detection of Parkinson's disease based on the analysis of a patient's gait. The growing popularity and success of Transformer networks in natural language processing and image recognition motivated us to develop a novel method for this problem based on automatic feature extraction via Transformers. The use of Transformers on 1D signals is not yet widespread, but we show in this paper that they are effective in extracting relevant features from such signals. As Transformers require a lot of memory, we decoupled temporal and spatial information to make the model smaller. Our architecture uses temporal Transformers, dimension-reduction layers to reduce the dimension of the data, a spatial Transformer, two fully connected layers, and an output layer for the final prediction. Our model outperforms the current state-of-the-art algorithm, with 95.2% accuracy in distinguishing a Parkinsonian patient from a healthy one on the Physionet dataset. A key learning from this work is that Transformers allow for greater stability in results. The source code and pre-trained models are released at https://github.com/DucMinhDimitriNguyen/Transformers-for-1D-signals-in-Parkinson-s-disease-detection-from-gait.git (International Conference on Pattern Recognition, ICPR 2022).
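    The building block behind both the temporal and spatial Transformers is scaled dot-product self-attention over a sequence. A minimal NumPy sketch of this primitive applied to a 1D signal (illustrative only, not the paper's released architecture; all names are assumptions):

```python
import numpy as np

def self_attention_1d(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a 1D signal.

    x: (T, d) sequence of d-dimensional samples (e.g. gait sensor
    readings at T time steps); Wq/Wk/Wv: (d, d_k) projection matrices.
    Returns attended features of shape (T, d_k).
    """
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    # Row-wise softmax: each time step attends over all time steps.
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V
```

    Because the attention matrix is T×T, memory grows quadratically with sequence length; applying attention first along time per sensor and then across sensors, as the paper's decoupling does, keeps both attention matrices small.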

    Is there more room to negotiate with the IMF on fiscal policy?

    This repository item contains a working paper from the Boston University Global Economic Governance Initiative. The Global Economic Governance Initiative (GEGI) is a research program of the Center for Finance, Law & Policy, the Frederick S. Pardee Center for the Study of the Longer-Range Future, and the Frederick S. Pardee School of Global Studies. It was founded in 2008 to advance policy-relevant knowledge about governance for financial stability, human development, and the environment. During the 1980s the IMF emerged as a global “bad cop,” demanding harsh austerity measures in countries faced with debt problems. Has the Great Recession changed all that? Is there more room to negotiate with the Fund on fiscal policy? The answer is yes. If we take a close look at what IMF researchers say and what its most influential official reports proclaim, we can see that there has been a more “Keynesian” turn at the Fund. This means that today one can find arguments for less austerity, more growth measures, and a fairer social distribution of the burden of fiscal sustainability. The IMF has experienced a major thaw of its fiscal policy doctrine, and well-informed member states can use this to their advantage. These changes do not amount to a paradigm shift à la Paul Krugman's ideas. Yet crisis-ridden countries that are keen to avoid punishing austerity packages can exploit this doctrinal shift by exploring the policy implications of the IMF's own official fiscal doctrine and staff research. They can cut less spending, shelter the most disadvantaged, tax more at the top of the income distribution, and think twice before rushing into a fast austerity package.
    This much is clear in all of the Fund's World Economic Outlooks and Global Fiscal Monitors published between 2009 and 2013 with regard to four themes: the main goals of fiscal policy, the basic options for countries with or without fiscal space, the pace of fiscal consolidation, and the composition of fiscal stimulus and consolidation.

    Correspondencia inédita


    DiAE: Re-rolling the DiSE

    The notion of distributed authenticated encryption was formally introduced by Agrawal et al. at ACM CCS 2018. In their work, they propose the DiSE construction, building upon a distributed PRF (DPRF), a commitment scheme, and a PRG. We show that most of their constructions do not meet some of the claimed security guarantees. In fact, all the concrete instantiations of DiSE, as well as multiple follow-up papers (one accepted at ACM CCS 2021), fail to satisfy their strongly-secure definitions. We give simple fixes for these constructions and prove their security. We also propose a new construction, DiAE, using an encryptment instead of a commitment. This modification dispenses with the need to buffer the entire message throughout the encryption protocol, which in turn enables implementations with a constant RAM footprint and online message encryption. This is particularly interesting for constrained IoT devices. Finally, we implement and benchmark DiAE and show that it performs similarly to the original DiSE construction.
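    The benefit of online encryption with a constant RAM footprint can be illustrated by a chunked encrypt-and-tag loop that never buffers the whole message. This toy sketch is a stand-in for the general idea only, not the actual DiAE or encryptment construction; all names are assumptions:

```python
import hashlib
import hmac

def stream_encrypt(key, nonce, chunks):
    """Illustrative online encryption with constant memory: each
    chunk (assumed <= 32 bytes here) is XORed with a per-chunk
    keystream block and folded into a running tag, so the message
    is processed chunk by chunk and never held in RAM at once.
    """
    tag = hmac.new(key, nonce, hashlib.sha256)
    out = []
    for i, chunk in enumerate(chunks):
        # Derive a fresh 32-byte keystream block per chunk index.
        stream = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        ct = bytes(c ^ s for c, s in zip(chunk, stream))
        tag.update(ct)   # running tag: no buffering of the message
        out.append(ct)
    return out, tag.digest()
```

    A commitment-based flow would instead need the full message before committing, which is exactly the buffering that the encryptment-based DiAE avoids.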

    KRAB zinc finger protein ZNF676 controls the transcriptional influence of LTR12-related endogenous retrovirus sequences.

    BACKGROUND: Transposable element-embedded regulatory sequences (TEeRS) and their KRAB-containing zinc finger protein (KZFP) controllers are increasingly recognized as modulators of gene expression. We aim to characterize the contribution of this system to gene regulation in early human development and germ cells. RESULTS: Here, after studying genes driven by the long terminal repeat (LTR) of endogenous retroviruses, we identify the ape-restricted ZNF676 as the sequence-specific repressor of a subset of contemporary LTR12 integrants responsible for a large fraction of transpochimeric gene transcripts (TcGTs) generated during human early embryogenesis. We go on to reveal that the binding of this KZFP correlates with the epigenetic marking of these TEeRS in the germline, and is crucial to the control of genes involved in ciliogenesis/flagellogenesis, a biological process that dates back to the last common ancestor of eukaryotes. CONCLUSION: These results illustrate how KZFPs and their TE targets contribute to the evolutionary turnover of transcription networks and participate in the transgenerational inheritance of epigenetic traits.